In this section, we will discover what the editor is capable of. For that, we will reproduce what the tutorials within nkGraphics end up with, in tutorial 07 about composition.
This will let us witness how transparently, and how easily, the API can be driven from the UI. At the end of this tutorial, we will reach a rendering comparable to :
Ready to start ? Then let's go !
The editor is available in the Bin folder of the release archive. Once launched, you will be greeted by something like :
A quick tour :
And that concludes the initial interface tour. From there, it is possible to access everything the editor has to offer.
Let's start by loading the basic resources we need :
Let's load the mesh. Open the Resources menu, and select the Meshes item :
This will make another window pop up :
This design is used for resource windows in general :
For the meshes, in the version on which this tutorial is based, you can rename them, save their declaration files, change their source data, and witness / alter their attributes. Let's first create a resource. For that, click on the little "+" button you will find in the bottom left part of the window, under the resource list :
This will open a sub-window :
Once more, you will find this window for each resource type available. On the top, you can specify a declaration file to load from to create the resource(s). The bottom part allows you to create a resource from scratch by specifying its name. Let's do that, by entering the name "Sphere" and hitting enter, or the button next to the field. It now appears in the resource list, and if we click on it :
As you can see, the interface is now filled with what the mesh has to offer. It is named "Sphere", has no data path set yet, and its data only holds the vertex positions. While it can be surprising, this is true in this context : our mesh can be used right away, even though it is empty. But that would not be very fun to see, so let's search for a file to load. Hit the "..." button within the Source group, and search for "sphere.obj" within the Data/Meshes folder of the release.
Once done, the interface will be updated :
The Source data path is now updated. We also witness new attributes provided by the mesh : positions, normals, uvs. This is what the mesh has available in its data, as detected by the editor. API-wise, what we did is create a mesh and set its resource path. The loading is managed by the editor itself.
Next step will be to add it to the scene. For that, we can close the mesh window and go back to the main window. Then, right click within the Scene Tree list :
Altering the scene tree goes through right clicking on items within the list. If you click on "Add Node", you will get a new item in the list, Node_0.
Right clicking on this new item will hold new possibilities. Select "Add Renderable".
On this new "Entity" item, right click again and select "Append Mesh".
In the list window popping, select the mesh, "Sphere", and hit ok.
The scene tree list should look like :
If we were doing this through the API, these clicks would hide setting an entity in the render queue, adding a mesh in its sub entity, and plugging it to a node. And believe it or not, the sphere is actually rendered in the 3D Graphics window !
If you read the nkGraphics tutorial series, you probably remember that the sphere is centered around 0, which is exactly where our camera starts. While our reflex in nkGraphics was to move the node, there is no need for that in the editor. We will rather move the camera, using the input pattern available in the editor :
This camera controller scheme is quite similar to the way you would control a free camera within a First Person Shooter game. So, to move our camera away from the sphere, we need to :
You should see something similar to :
By then, using the sphere as a visual reference, feel free to test the controller a bit and see how it feels. Once ready, let's proceed !
Keep the mesh creation process in mind, as it illustrates how resources are created in the editor.
With that, it should be fairly easy to create the texture.
Open the window, through the menu bar : Resources > Textures.
Click the "+" button, specify the name, and select it within the list.
Next thing we want to do is load the cubemap. For that, click on the "Open File..." button in the bottom right and search for the Data/Textures/YokohamaNight.dds texture.
Once done, you should be able to see that the texture type has been updated to TEX_CUBEMAP. Its dimensions and format are also deduced. The texture is ready to be used !
The hidden API process is the same : create the texture, set its resource path, and load it.
Like any other resource, a program needs to be created through the dedicated "Shader Programs" window.
Close the Texture window. In the menu bar : Resources > Programs.
Let's create 2 programs, one for the sphere and one for the environment behind. For each of them, click on the "+" button under the list, and name them as wanted.
We will first work with the one for the sphere. Select the program we will work on.
Upon selection, the program will be flagged by default as a "Pipeline" program, with default simple HLSL sources. A program in the editor needs to be of a certain type, depending on the usage you will have for it :
Now, we want to assign this program to an object, so let's keep it as a pipeline program. In each group, you will notice the program stages that can be written (vertex, domain...). We will need the vertex and pixel stages. Let's edit the vertex stage, by clicking the "Edit" button next to it :
A new window will pop up, with default shader sources prefilled :
The menu bar specifies the name of the program being edited.
On the top, each tab corresponds to a program stage. When switching, you will find the current active source for the given stage.
The macros group allows you to specify macros for the program compilation.
The sources group speaks for itself, and holds the sources of the program.
The "Nothing to show." part is the compilation log window, reporting potential errors when recompiling through the associated "Recompile" button in the bottom right.
Let's edit the sources of our vertex stage to match :
Now switch to the pixel stage tab, and change its sources to :
And hit the recompile button. The compilation should run fine, as reported by the log window :
On this part, the editor does a lot for us, by prefilling information and ensuring the program is in a good usable state.
However, even with that, you can maybe recognize the API : create the program, update its source from memory, and load.
For the program to be used, we need the shader. Let's create it through its dedicated interface, with the usual process.
Close the program window, and hit Resources > Shaders.
Now hit "+" in the existing resources group, specify the name, and select it.
You will have a window like this :
The shader window can seem intimidating at first, but it is centered around two aspects :
With that in mind, the interface should be clearer. On the top, you set the program to use ; as a reminder, the shader type will be deduced from the program type.
Then, the tab window at the bottom is where you will find the information about what the shader feeds to the program.
It is organized by resource type :
First step, then, is to set the program to use.
Click on the "..." button in the program group, and select the program it should use :
Now, we need to specify which resources to feed it.
First, ensure the Textures tab is active.
The process to add a resource is as follows : through the sub-window, add a resource using the "+" button. It will be added with a default value.
Click on it, and change the parameters needed so it feeds what we need.
For instance, for our texture, add it, click on it, and click the "..." button after the Texture label. Select the environment map, and validate :
Which should result in something like :
Then, go to the Samplers tab and add one sampler. The default one will be sufficient, so no need to edit anything.
Next we need to add a constant buffer. Switch to the Constant Buffers tab :
As with every other resource, click the "+" in the Constant Buffers group to add a buffer. Select it in the list, which will update the rest of the interface :
A freshly created buffer has no pass slot, so we need to add some from the dedicated list. First, add a slot through the "+" button. Select the created slot, which is by default a vector, and alter the types to fit the HLSL sources :
The type names are directly mapped to the enumeration values within nkGraphics.
With the type set, we don't need to alter any parameter in the slots.
Repeat the process to fit what the cbuffer expects within the program. In the end, you should get something like :
The order is important as the resource is fed sequentially, from top to bottom.
We need a VIEW_MATRIX, a PROJ_MATRIX, and a CAM_POSITION slot.
Finally, go into the Instance Buffers tab and add a slot, which will automatically be the World Matrix we need :
Once this is added, our shader is using the program and providing all the input it needs. We can go forward and use the shader !
Now we need to use the shader with the mesh currently shown. You probably saw it passing by : what we need now is to set the shader via the scene tree. Close the shader window, switch to the main window, right click on the Entity, and assign the pipeline shader we just created through the list :
The shader we need to alter is the "pipeline" one.
It is the shader used for a scene rendering pass using rasterization, which is what is currently done.
The raytracing shader is the one used during a raytracing pass, which is something we will discover later.
The rendering in the 3D window should now be drastically different :
What is left before altering the composition is to create and setup the shader and program for the background.
First, reopen the program window (Resources > Programs).
Create a new program (click on "+" and select it in the list), leave it as a pipeline program, and open the sources for the vertex stage.
Replace them by :
Switch to the pixel stage tab, and replace the sources by :
We won't go over the sources again here ; please consult the tutorial about composition (07) within nkGraphics for that. Just know that the heart of it is using the particularities of the post-processing pass to easily deduce a world direction for each screen pixel, and sample the environment map from there. Hit the recompile button, and ensure everything runs without any problem.
It is time to create the attached shader. Close the program window, and search for Resources > Shaders in the main window menu bar.
In the window, create a new shader ("+" and select in the list), set its program ("..." in the program group) to the one just recompiled.
Add the texture (Textures tab, "+" in the texture list, select it, and "..." to set its source to the environment map).
Add the sampler (Samplers tab, "+" in the sampler list) and leave the default one as active.
Add the constant buffer and its slots (Constant Buffers tab, "+" in the buffer list, select it, add a slot through "+", and change its type in the settings).
The pass slot type required is the camera direction in world space (CAM_DIRECTION_WORLD).
With this, the shader is ready ! What is left is creating the compositor, and using it for rendering.
Compositors are like any other resource, and we need to open their window to manipulate them. This window might seem more complex at first sight, but it directly maps over what the API offers. As such, if you know the API, you should be able to understand what it is about and manipulate it right away. If not, let's see the window in detail :
The window presents all aspects of a compositor in one place. It reads from top left to bottom right.
On the left is the list of compositors, as for every resource. Then comes the name control, and the heart of the compositor API. From there, you can add a node to the list. Upon selection, the node settings (activity and target operations) become visible in the next group box. Adding a target operation and selecting it leads to the next group, from which you can set the targets, viewport, and passes. Adding a pass and selecting it leads to the next group again, at the start of the next line. Its type can be changed, and depending on that type, adapted settings can be tweaked within the last group on the right.
Once a compositor fits your need, its declaration can be saved through the dedicated group button.
From the UI, we need to reproduce the compositor done within the tutorial of nkGraphics. The process will be as such :
The process is very close to the API :
Once setup, we need to tell our editor to use the compositor for its active rendering. For this we need to access the rendering logic window through the Scene menu :
We are greeted by this window, in which we need to change the active compositor and select the one we created in the list :
Once this is done, the rendering window should reflect that :
Now that we've done all of this and are satisfied with the way it looks, it would be wise to save it to be able to reopen and iterate over it quickly.
In the main window menu : File > Save.
In the browser created, find the folder in which you want to save, and validate.
The saving process will create some files and folders, so it is advised to save in a dedicated folder to keep the structure clear.
And that's it ! The current setup is saved. Now, when opening the editor again, just hit File > Load and reopen the saved project file. The scene will be loaded again and you will be able to start from where you left off.
As you probably noted, the editor is driving the API in a very transparent way. The steps closely emulate the lines of code we would need to write in a program.
However, as low-level as this can seem, it allows great control over the rendering we want.
The UI on top of that adds a lot of comfort, and allows you to prototype and iterate much quicker than a standard C++ application development process would.
And with that, the editor still has some functionalities we have yet to unveil. Isn't that pretty ?